Optics

Portable Integrated Fourier Ptychographic Microscope
NASA’s integrated, portable Fourier ptychographic microscopy (FPM) device combines advanced optical microscopy with AI in a compact form factor. At its core, the system uses a Raspberry Pi camera module with an 8-megapixel sensor and a 3-mm focal-length lens, achieving approximately 1.5× magnification. Illumination is provided by a Unicorn HAT HD LED array positioned 65 mm below the sample stage, creating a synthetic numerical aperture of 0.55. These components are controlled by an NVIDIA Jetson Nano board, which serves as the system's embedded AI computing platform. The device can also be integrated with a microfluidic system for sub-micron imaging of liquid samples.
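The synthetic numerical aperture arises because each off-axis LED illuminates the sample from a different angle, shifting a different region of the sample's spatial spectrum into the lens's passband. A minimal sketch of the underlying geometry (the specific objective NA and LED offsets below are illustrative assumptions, not published specifications of the NASA device):

```python
import math

def illumination_na(led_offset_mm: float, led_height_mm: float) -> float:
    """NA contribution of an off-axis LED: the sine of the illumination
    angle set by its lateral offset and its height below the sample."""
    return math.sin(math.atan2(led_offset_mm, led_height_mm))

def synthetic_na(objective_na: float, max_led_offset_mm: float,
                 led_height_mm: float) -> float:
    """Synthetic NA of an FPM system: the objective's native NA plus the
    largest illumination NA reached by the outermost LEDs in the array."""
    return objective_na + illumination_na(max_led_offset_mm, led_height_mm)

# Example with assumed values: an outermost LED offset 26 mm from the
# optical axis, 65 mm below the stage as in the NASA device.
na = synthetic_na(objective_na=0.17, max_led_offset_mm=26.0,
                  led_height_mm=65.0)
```

The on-axis LED contributes zero illumination NA, so resolution beyond the objective's native limit comes entirely from the outer LEDs.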
In addition to its portability, what sets NASA’s integrated FPM device apart is its integration of deep learning capabilities. The invention has two modes: a normal mode and a deep learning mode. In normal mode, the system captures data and performs intensity and 3D phase analysis using traditional FPM methods. The deep learning mode builds on this base functionality by employing neural networks for image reconstruction and system optimization. The AI system automatically detects when samples are out of focus and adjusts, either mechanically or digitally, to the correct focal plane. To achieve near real-time monitoring, the deep learning models significantly reduce data acquisition time by selectively illuminating only a portion of the LED array, and they enable fast reconstruction by training on specific sample types.
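The source does not describe the autofocus network itself, but a common classical baseline for detecting defocus, which such a network would improve on, is a sharpness metric such as the variance of the image Laplacian. A minimal sketch (function names and the threshold are illustrative assumptions):

```python
import numpy as np

def focus_score(image: np.ndarray) -> float:
    """Variance of the discrete 5-point Laplacian over the image interior.
    Sharp, in-focus images have strong edges and score high; defocused
    images are smoothed and score low."""
    lap = (-4.0 * image[1:-1, 1:-1]
           + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    return float(lap.var())

def is_out_of_focus(image: np.ndarray, threshold: float) -> bool:
    """Flag frames whose sharpness falls below a calibrated threshold,
    triggering a mechanical or digital refocus step."""
    return focus_score(image) < threshold
```

In practice the threshold would be calibrated per sample type; the NASA device's deep learning mode replaces this hand-tuned heuristic with a learned model.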
While originally designed for imaging biosignature motility in liquid samples for spaceflight applications, this NASA innovation can image many different types of samples, and is not limited to biological specimens. The ability to operate this system in the field further broadens use-cases.
Robotics Automation and Control

Airborne Machine Learning Estimates for Local Winds and Kinematics
The MAchine learning ESTimations for uRban Operations (MAESTRO) system is a novel approach that couples commodity sensors with advanced algorithms to provide real-time onboard local wind and kinematics estimates to a vehicle's guidance and navigation system. Sensors and computations are integrated in a novel way to predict local winds and promote safe operations in dynamic urban regions where Global Positioning System/Global Navigation Satellite System (GPS/GNSS) and other network communications may be unavailable or difficult to obtain among tall buildings due to multi-path reflections and signal diffusion. The system can be implemented onboard an Unmanned Aerial System (UAS), and once airborne it does not require communication with an external data source or with GPS/GNSS. Estimates of the local winds (speed and direction) are created using inputs from onboard sensors that scan the local building environment. This information can then be used by the onboard guidance and navigation system to determine safe and energy-efficient trajectories for operations in urban and suburban settings. The technology is robust to dynamic environments, input noise, missing data, and other uncertainties, and has been demonstrated successfully in lab experiments and computer simulations.
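MAESTRO's machine-learning estimator is not detailed in this summary, but the kinematic quantity it produces can be illustrated with the classical wind-triangle relation: the wind vector is the difference between the vehicle's velocity over the ground and its velocity through the air. A minimal sketch of that baseline relation (not MAESTRO's actual algorithm; the function and variable names are assumptions):

```python
import math

def estimate_wind(ground_vel: tuple[float, float],
                  air_vel: tuple[float, float]) -> tuple[float, float]:
    """Wind-triangle baseline: wind = ground velocity - air velocity.
    Velocities are (east, north) components in m/s. Returns wind speed
    in m/s and the compass bearing (degrees from north) the wind blows
    toward."""
    wx = ground_vel[0] - air_vel[0]  # east component of the wind
    wy = ground_vel[1] - air_vel[1]  # north component of the wind
    speed = math.hypot(wx, wy)
    bearing_deg = math.degrees(math.atan2(wx, wy)) % 360.0
    return speed, bearing_deg
```

A GNSS-denied system like MAESTRO cannot take the ground velocity from GPS, which is precisely why it instead infers winds from onboard sensing of the surrounding building environment.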